Integrating boosting and stochastic attribute selection committees for further improving the performance of decision tree learning
Authors
Abstract
Techniques for constructing classifier committees including Boosting and Bagging have demonstrated great success, especially Boosting for decision tree learning. This type of technique generates several classifiers to form a committee by repeated application of a single base learning algorithm. The committee members vote to decide the final classification. Boosting and Bagging create different classifiers by modifying the distribution of the training set. Sasc (Stochastic Attribute Selection Committees) uses an alternative approach to generating classifier committees by stochastic manipulation of the set of attributes considered at each node during tree induction, but keeping the distribution of the training set unchanged. In this paper, we propose a method for improving the performance of Boosting. This technique combines Boosting and Sasc. It builds classifier committees by manipulating both the distribution of the training set and the set of attributes available during induction. In the synergy, Sasc effectively increases the model diversity of Boosting. Experiments with a representative collection of natural domains show that, on average, the combined technique outperforms both Boosting and Sasc in terms of reducing the error rate of decision tree learning.
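The abstract describes the combined technique only at this level. Below is a minimal sketch of the idea, assuming scikit-learn's DecisionTreeClassifier, whose max_features option stands in for Sasc's stochastic per-node attribute sampling, together with a plain AdaBoost-style reweighting loop for the Boosting side. N_TREES and ATTRIBUTE_FRACTION are illustrative values, not settings from the paper.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative parameters (not from the paper): committee size and the
# fraction of attributes considered at each split, approximating Sasc's
# stochastic attribute selection during tree induction.
N_TREES = 10
ATTRIBUTE_FRACTION = 0.5

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

n = len(y_tr)
weights = np.full(n, 1.0 / n)          # Boosting: start from uniform weights
committee = []

for t in range(N_TREES):
    # Sasc side: each split considers only a random subset of attributes,
    # while the training distribution itself is left intact.
    tree = DecisionTreeClassifier(max_features=ATTRIBUTE_FRACTION, random_state=t)
    # Boosting side: the training distribution is altered via sample weights.
    tree.fit(X_tr, y_tr, sample_weight=weights)

    pred = tree.predict(X_tr)
    err = np.sum(weights[pred != y_tr]) / np.sum(weights)
    if err == 0 or err >= 0.5:
        # Degenerate AdaBoost cases, handled in a simplified way here.
        committee.append((tree, 1.0))
        break
    alpha = 0.5 * np.log((1 - err) / err)
    committee.append((tree, alpha))

    # Re-weight: misclassified examples gain weight, correct ones lose it.
    weights *= np.exp(alpha * np.where(pred != y_tr, 1.0, -1.0))
    weights /= weights.sum()

# Weighted committee vote on the test set (binary labels 0/1 mapped to -1/+1).
votes = np.zeros(len(y_te))
for tree, alpha in committee:
    votes += alpha * np.where(tree.predict(X_te) == 1, 1.0, -1.0)
y_pred = (votes > 0).astype(int)
print("committee error rate:", np.mean(y_pred != y_te))

Each member is thus trained on reweighted data (Boosting) while every split sees only a random subset of attributes (Sasc), and the weighted vote of the committee decides the final classification.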
Similar articles
Stochastic Attribute Selection Committees with Multiple Boosting: Learning More
Classifier learning is a key technique for KDD. Approaches to learning classifier committees, including Boosting, Bagging, Sasc, and SascB, have demonstrated great success in increasing the prediction accuracy of decision trees. Boosting and Bagging create different classifiers by modifying the distribution of the training set. Sasc adopts a different method. It generates committees by stochastic ma...
Stochastic Attribute Selection Committees
Classifier committee learning methods generate multiple classifiers to form a committee by repeated application of a single base learning algorithm. The committee members vote to decide the final classification. Two such methods, Bagging and Boosting, have shown great success with decision tree learning. They create different classifiers by modifying the distribution of the training set. This paper stu...
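As a concrete illustration of "modifying the distribution of the training set", here is a minimal Bagging-style sketch (not code from the paper), again assuming scikit-learn's DecisionTreeClassifier: each committee member is trained on a bootstrap resample, and the members decide the final classification by majority vote.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
committee = []
for t in range(10):
    # Bagging changes the training distribution by bootstrap resampling:
    # each member sees a sample drawn with replacement from the original set.
    idx = rng.integers(0, len(y_tr), size=len(y_tr))
    tree = DecisionTreeClassifier(random_state=t).fit(X_tr[idx], y_tr[idx])
    committee.append(tree)

# The committee members vote; the majority decides the final classification.
votes = np.stack([tree.predict(X_te) for tree in committee])
y_pred = (votes.mean(axis=0) > 0.5).astype(int)
print("bagged committee error rate:", np.mean(y_pred != y_te))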
Generating Classifier Committees by Stochastically Selecting both Attributes and Training Examples
Boosting and Bagging, as two representative approaches to learning classifier committees, have demonstrated great success, especially for decision tree learning. They repeatedly build different classifiers using a base learning algorithm by changing the distribution of the training set. Sasc, as a different type of committee learning method, can also significantly reduce the error rate of decision t...
Assessment of distance-based multi-attribute group decision-making methods from a maintenance strategy perspective
Maintenance has been acknowledged by industrial management as a significant factor influencing plant performance. Effective plant maintenance can be realized by developing a proper maintenance strategy. However, selecting an appropriate maintenance strategy is difficult because maintenance, unlike production activity, is not a repetitive task. Maintenance also does not leave a consistent trac...
SLEAS: Supervised Learning using Entropy as Attribute Selection Measure
There is growing importance in scaling up the broadly used decision tree learning algorithms to huge datasets. Even though numerous diverse methodologies have been proposed, a fast tree-growing algorithm without a substantial decrease in accuracy or a substantial increase in space complexity is still essential. This paper aims at improving the performance of the SLIQ (Supervised Le...
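As an illustration of entropy used as an attribute selection measure, here is a minimal sketch (function names and the toy data are illustrative, not taken from SLEAS or SLIQ) that scores an attribute by the information gain it yields.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of class labels.
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attribute_index):
    # Reduction in entropy obtained by splitting on one attribute.
    base = entropy(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attribute_index], []).append(label)
    remainder = sum(len(part) / len(labels) * entropy(part)
                    for part in partitions.values())
    return base - remainder

# Toy data: two attributes, binary class.
rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "hot")]
labels = ["no", "no", "yes", "yes"]
print(information_gain(rows, labels, 0))  # 1.0: first attribute separates the classes perfectly
print(information_gain(rows, labels, 1))  # 0.0: second attribute is uninformative

The attribute with the highest information gain would be chosen as the split attribute at a given node.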